ShiftDelete.Net Global

Nvidia faces price hike pressure as Samsung HBM4 costs double


Nvidia may be forced to raise prices on its next-gen GPUs and AI chips, thanks to skyrocketing costs tied to new memory tech. Samsung’s upcoming HBM4 modules, boasting 3.3 TB/s of bandwidth, will reportedly cost Nvidia more than twice as much as the current generation, a sharp shift that could ripple across the gaming and AI landscape by 2026.

Samsung has managed to nearly triple the bandwidth of its new HBM4 memory compared to HBM3E. These 36GB modules now hit 3.3 TB/s thanks to a redesigned stacking architecture and advanced signal correction techniques. But that power won’t come cheap.
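The "nearly triple" claim can be sanity-checked with quick arithmetic. The 3.3 TB/s figure comes from the report; the HBM3E per-stack baseline of roughly 1.2 TB/s is an assumption here, derived from the commonly quoted 9.6 Gbps pin speed across a 1024-bit interface:

```python
# Rough per-stack bandwidth comparison between HBM generations.
# HBM4 figure is from the report; the HBM3E baseline is an assumption
# (~9.6 Gbps per pin x 1024-bit bus / 8 bits ≈ 1.2 TB/s per stack).
HBM3E_TBPS = 1.2
HBM4_TBPS = 3.3

speedup = HBM4_TBPS / HBM3E_TBPS
print(f"HBM4 vs HBM3E per-stack bandwidth: {speedup:.2f}x")  # ~2.75x, i.e. "nearly triple"
```

Under those assumptions the generational jump works out to about 2.75x, which matches the "nearly triple" characterization.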

Industry insiders say Nvidia is set to pay Samsung over $500 per HBM4 module, more than double the roughly $250 Samsung charged for HBM3E. SK Hynix is reportedly charging even more, around $550, largely due to higher production costs tied to TSMC’s base die supply.
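To see how those per-module prices compound at the product level, here is a back-of-the-envelope sketch. The per-module prices are the reported figures; the eight-stacks-per-accelerator count is a hypothetical assumption for illustration only, since actual stack counts vary by product:

```python
# Back-of-the-envelope memory bill per accelerator at the reported prices.
STACKS_PER_GPU = 8      # assumed stack count; varies by product
HBM3E_PRICE = 250       # USD per module (reported Samsung price)
HBM4_SAMSUNG = 500      # USD per module (reported, "over $500")
HBM4_SKHYNIX = 550      # USD per module (reported SK Hynix price)

old_bill = STACKS_PER_GPU * HBM3E_PRICE
new_bill = STACKS_PER_GPU * HBM4_SAMSUNG
print(f"HBM3E memory bill:  ${old_bill}")              # $2000
print(f"HBM4 memory bill:   ${new_bill}")              # $4000
print(f"Added cost per GPU: ${new_bill - old_bill}")   # $2000 more per accelerator
```

Even under this simplified model, the memory line item alone adds thousands of dollars per accelerator, which is the pressure the report says Nvidia is likely to pass on.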

With demand for AI computing exploding, Nvidia doesn’t have much leverage. “Nvidia’s demand for HBM4 is so high that Samsung Electronics has no choice but to secure its supply at a high price,” sources revealed.


As Nvidia, Google, and Meta pour billions into AI infrastructure, they’re outbidding PC and gaming hardware makers on memory orders. Epic CEO Tim Sweeney recently warned that this trend could price premium gaming PCs out of reach. His comment came after RAM prices doubled within a month, with some users paying $500 for kits that cost half as much weeks earlier.

This crunch isn’t isolated to HBM. SK Hynix has also confirmed blistering specs for its new GDDR7 and LPDDR6 memory.

These next-gen modules will power both AI inference workloads and high-end gaming graphics cards. But if current trends hold, gamers may face GPU price tags that creep even higher in 2026.

Nvidia’s AI accelerators and flagship GPUs, the next-gen successors to its H100-class chips and Titan-class RTX cards, will likely be the first to adopt HBM4. With a mid-$500 module price and demand showing no signs of slowing, Nvidia will almost certainly pass those costs downstream.

Here’s what’s unfolding:

Nvidia isn’t just buying faster memory; it’s buying at any price to stay ahead in AI. That leaves PC users and gamers caught in the middle. If Samsung’s HBM4 module really does cost twice as much while delivering nearly triple the bandwidth, Nvidia will deliver more power than ever, but not without making wallets flinch.

The AI race is hot. And with memory prices rising fast, so are the stakes.
